Nystrom Method for Approximating the GMM Kernel

Author

  • Ping Li

Abstract

The GMM (generalized min-max) kernel was recently proposed [5] as a measure of data similarity and was demonstrated to be effective in machine learning tasks. In order to use the GMM kernel for large-scale datasets, the prior work resorted to (generalized) consistent weighted sampling (GCWS) to convert the GMM kernel to a linear kernel. We call this approach “GMM-GCWS”. In the machine learning literature, there is a popular algorithm which we call “RBF-RFF”: one can use “random Fourier features” (RFF) to convert the “radial basis function” (RBF) kernel to a linear kernel. It was empirically shown in [5] that RBF-RFF typically requires substantially more samples than GMM-GCWS in order to achieve comparable accuracies. The Nystrom method is a general tool for approximating nonlinear kernels, which again converts nonlinear kernels into linear kernels. We apply the Nystrom method to approximate the GMM kernel, a strategy we name “GMM-NYS”. In this study, our extensive experiments on a set of fairly large datasets confirm that GMM-NYS is also a strong competitor of RBF-RFF.
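For concreteness, the following is a minimal NumPy sketch of the GMM-NYS idea, assuming the GMM kernel construction from [5] (split each vector into its positive and negative parts, then take the ratio of summed element-wise minima to summed element-wise maxima) and the standard Nystrom recipe (sample m landmark points and map each data point through K(x, landmarks) W^{-1/2}, where W is the kernel matrix of the landmarks). The function names, the landmark count m, and the eigenvalue threshold are illustrative choices for this sketch, not details taken from the paper.

import numpy as np

def split_pos_neg(X):
    # Rewrite each vector with nonnegative entries: [max(x, 0), max(-x, 0)].
    return np.hstack([np.maximum(X, 0.0), np.maximum(-X, 0.0)])

def gmm_kernel(X, Y):
    # Pairwise GMM kernel: sum of element-wise minima over sum of maxima.
    Xt, Yt = split_pos_neg(X), split_pos_neg(Y)
    K = np.empty((Xt.shape[0], Yt.shape[0]))
    for i, x in enumerate(Xt):
        mins = np.minimum(x, Yt).sum(axis=1)
        maxs = np.maximum(x, Yt).sum(axis=1)
        K[i] = np.where(maxs > 0, mins / maxs, 0.0)
    return K

def gmm_nystrom_features(X, m=128, seed=0):
    # Nystrom approximation: sample m landmarks, then map every point to
    # K(x, landmarks) @ W^{-1/2}, where W = K(landmarks, landmarks).
    rng = np.random.default_rng(seed)
    idx = rng.choice(X.shape[0], size=min(m, X.shape[0]), replace=False)
    landmarks = X[idx]
    W = gmm_kernel(landmarks, landmarks)
    C = gmm_kernel(X, landmarks)
    # Pseudo-inverse square root of W (W is symmetric positive semi-definite).
    vals, vecs = np.linalg.eigh(W)
    inv_root = np.zeros_like(vals)
    pos = vals > 1e-10
    inv_root[pos] = vals[pos] ** -0.5
    inv_sqrt = (vecs * inv_root) @ vecs.T
    return C @ inv_sqrt

if __name__ == "__main__":
    X = np.random.randn(500, 20)
    Z = gmm_nystrom_features(X, m=64)
    err = np.abs(Z @ Z.T - gmm_kernel(X, X)).mean()
    print(f"mean absolute approximation error: {err:.4f}")

Inner products of the resulting features approximate the GMM kernel, so they can be fed to any linear learner (e.g., a linear SVM or logistic regression), which is the point of converting the nonlinear kernel into a linear one.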

Related articles

Generalized Intersection Kernel

Following the very recent line of work on the “generalized min-max” (GMM) kernel [7], this study proposes the “generalized intersection” (GInt) kernel and the related “normalized generalized min-max” (NGMM) kernel. In computer vision, the (histogram) intersection kernel has been popular, and the GInt kernel generalizes it to data which can have both negative and positive entries. Through an ext...

An effective method for approximating the solution of singular integral equations with Cauchy kernel type

In the present paper, a numerical approach for solving Cauchy-type singular integral equations is discussed. Lagrange interpolation with Gauss-Legendre quadrature nodes and Taylor series expansion are utilized to reduce the computation of the integral equations to a set of algebraic equations. Finally, five examples with exact solutions are given to show the efficiency and applicability of the method. Also, w...

Kernel Weighted GMM Estimators for Linear Time Series Models

This paper analyzes the higher order asymptotic properties of Generalized Method of Moments (GMM) estimators for linear time series models using many lags as instruments. A data dependent moment selection method based on minimizing the approximate mean squared error is developed. In addition, a new version of the GMM estimator based on kernel weighted moment conditions is proposed. It is shown ...

Kernel Trick Embedded Gaussian Mixture Model

In this paper, we present a kernel-trick-embedded Gaussian Mixture Model (GMM), called kernel GMM. The basic idea is to embed the kernel trick into the EM algorithm and derive a parameter estimation algorithm for GMM in feature space. Kernel GMM can be viewed as a Bayesian kernel method. Compared with most classical kernel methods, the proposed method can solve problems in a probabilistic framework. Mo...

Kernel GMM and its application to image binarization

Gaussian Mixture Model (GMM) is an efficient method for parametric clustering. However, traditional GMM cannot perform clustering well on data sets with complex structure, such as images. In this paper, the kernel trick, successfully used by SVM and kernel PCA, is introduced into the EM algorithm for solving the parameter estimation of GMM, yielding the so-called kernel GMM (kGMM). The basic idea of kernel GMM is...

Journal:
  • CoRR

Volume: abs/1607.03475

Pages: -

Publication year: 2016